
Sparse Transformers

#artificialintelligence

Originally published on Towards AI, the world's leading AI and technology news and media company. If you are building an AI-related product or service, we invite you to consider becoming an AI sponsor. At Towards AI, we help scale AI and technology startups. Let us help you unleash your technology to the masses. If you want to analyze how fast 19 sparse BERT models perform inference, you'll only need a YAML file and 16GB of RAM to find out.
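The "YAML file" the teaser refers to would describe which models to benchmark and under what settings. The exact schema isn't given here, so the following is only a hypothetical sketch of what such a benchmark config might look like; the field names and the SparseZoo-style model stubs are illustrative assumptions, not the actual file from the article.

```yaml
# Hypothetical benchmark config (illustrative only — field names are assumptions).
# Each entry points at a sparse BERT variant by a SparseZoo-style stub.
models:
  - zoo:nlp/question_answering/bert-base/pytorch/huggingface/squad/base-none
  - zoo:nlp/question_answering/bert-base/pytorch/huggingface/squad/pruned-moderate
  # ... additional stubs, one per sparse BERT variant being compared

# Shared inference settings applied to every model in the sweep.
batch_size: 1
sequence_length: 128
num_iterations: 100
num_warmup_iterations: 10
```

Running a sweep from a single config like this is what makes it practical to compare many model variants on one machine without writing per-model scripts.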


The NLP Cypher

#artificialintelligence

Together with the great engineering minds at Neural Magic, we're all actively attempting to solve a very difficult problem: how do we get these large models into production without blowing up our hardware or our wallets? We all want the same robust performance from our deep learning models. We want them to be accurate, as light as possible, and fast. So… how do we achieve this?